Asynchronous level bundle methods
Authors
Abstract
Similar resources
Level bundle methods for oracles with on-demand accuracy
For nonsmooth convex optimization, we consider level bundle methods built using an oracle that computes values for the objective function and a subgradient at any given feasible point. For the problems of interest, the exact oracle information is computable, but difficult to obtain. In order to save computational effort the oracle can provide estimations with an accuracy that depends on two add...
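To make the mechanics concrete, here is a minimal one-dimensional sketch of a level bundle step with an exact oracle: build the cutting-plane model from collected cuts, compute a lower bound over the feasible interval, set a level between the lower bound and the best value, and project the current point onto the model's level set. The function name `level_bundle_1d`, the level parameter `lam`, and the exact-oracle setting are illustrative assumptions, not the paper's algorithm:

```python
def level_bundle_1d(f, subgrad, x0, a, b, lam=0.5, tol=1e-6, max_iter=100):
    """Hedged 1-D level bundle sketch for min f(x) on [a, b], f convex.

    Cuts are affine minorants f(x_j) + g_j*(y - x_j); their max is the
    cutting-plane model m(y).  Flat cuts (g_j == 0) are ignored in the
    projection step for brevity.
    """
    cuts = []                      # list of (x_j, f_j, g_j)
    x, x_best, f_best = x0, x0, float("inf")
    for _ in range(max_iter):
        fx, gx = f(x), subgrad(x)
        cuts.append((x, fx, gx))
        if fx < f_best:
            f_best, x_best = fx, x

        def m(y):                  # piecewise-linear model
            return max(fj + gj * (y - xj) for xj, fj, gj in cuts)

        # exact min of m over [a, b]: attained at an endpoint or a
        # pairwise intersection of two cuts with different slopes
        pts = [a, b]
        for i in range(len(cuts)):
            for j in range(i + 1, len(cuts)):
                xi, fi, gi = cuts[i]
                xj, fj, gj = cuts[j]
                if gi != gj:
                    y = ((fj - gj * xj) - (fi - gi * xi)) / (gi - gj)
                    if a <= y <= b:
                        pts.append(y)
        f_low = min(m(y) for y in pts)

        gap = f_best - f_low
        if gap <= tol:
            break
        level = f_low + lam * gap  # target level between bound and best

        # projection of x onto {y in [a,b] : m(y) <= level}: in 1-D the
        # level set is an interval, so projection is a clamp
        lo, hi = a, b
        for xj, fj, gj in cuts:
            if gj > 0:
                hi = min(hi, xj + (level - fj) / gj)
            elif gj < 0:
                lo = max(lo, xj + (level - fj) / gj)
        x = min(max(x, lo), hi)
    return x_best, f_best
```

On f(x) = |x| starting from x = 2 on [-3, 3], the gap f_best - f_low roughly halves per iteration, illustrating the geometric shrinkage of the optimality gap that level methods exploit.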
Level bundle methods for constrained convex optimization with various oracles
We propose restricted memory level bundle methods for solving constrained convex nonsmooth optimization problems whose objective and constraint functions are known through oracles (black-boxes) that might provide inexact information. Our approach is general and covers many instances of inexact oracles, such as upper, lower and on-demand accuracy oracles. We show that the proposed level bundl...
Dynamic bundle methods
Lagrangian relaxation is a popular technique to solve difficult optimization problems. However, the applicability of this technique depends on having a relatively low number of hard constraints to dualize. When there are many hard constraints, it may be preferable to relax them dynamically, according to some rule depending on which multipliers are active. From the dual point of view, this approa...
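The idea of dualizing constraints dynamically can be sketched on a toy box-constrained LP: a multiplier is introduced only once its constraint is observed to be violated at the current relaxed solution. The function name `dynamic_lagrangian`, the example problem, and the plain projected-subgradient dual update (a stand-in for the bundle update a true dynamic bundle method would use) are all assumptions of this sketch:

```python
def dynamic_lagrangian(c, A, b, max_iter=50):
    """Lagrangian relaxation of  min c.x  s.t.  A x <= b,  x in [0,1]^n.

    Constraints are dualized dynamically: row i gets a multiplier only
    after it is violated at some relaxed solution.  The dual update is
    plain projected subgradient ascent with steps 1/(k+1); a dynamic
    bundle method would replace this step with a bundle model.
    Returns the best dual (lower) bound found.
    """
    n, m = len(c), len(b)
    lam = {}                       # multipliers for dualized rows only
    best_dual = float("-inf")
    for k in range(max_iter):
        # inner minimization over the box [0,1]^n is coordinatewise
        coeff = [c[j] + sum(lam[i] * A[i][j] for i in lam) for j in range(n)]
        x = [1.0 if cj < 0 else 0.0 for cj in coeff]
        dual_val = (sum(coeff[j] * x[j] for j in range(n))
                    - sum(lam[i] * b[i] for i in lam))
        best_dual = max(best_dual, dual_val)
        # dynamically dualize any constraint violated at the current x
        for i in range(m):
            viol = sum(A[i][j] * x[j] for j in range(n)) - b[i]
            if i not in lam and viol > 0:
                lam[i] = 0.0
        # projected subgradient step on the active multipliers
        t = 1.0 / (k + 1)
        for i in list(lam):
            g = sum(A[i][j] * x[j] for j in range(n)) - b[i]
            lam[i] = max(0.0, lam[i] + t * g)
    return best_dual
```

On the instance min -(x1 + 2*x2 + 3*x3) subject to x1 + x2 <= 1 and x2 + x3 <= 1 over the unit box, both rows become active after the first relaxed solution violates them, and the dual bound reaches the LP optimum -4 within a few iterations.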
Generalized Bundle Methods
We study a class of generalized bundle methods where the stabilizing term can be any closed convex function satisfying certain properties. This setting covers several algorithms from the literature that have been so far regarded as distinct. Under different hypotheses on the stabilizing term and/or the function to be minimized, we prove finite termination, asymptotic convergence and finite conv...
Functional Bundle Methods
Recently, gradient descent based optimization procedures and their functional gradient based boosting generalizations have shown strong performance across a number of convex machine learning formulations. They are particularly alluring for structured prediction problems due to their low memory requirements [5], and recent theoretical work has shown that they converge fast across a wide range of ...
Journal
Journal title: Mathematical Programming
Year: 2019
ISSN: 0025-5610, 1436-4646
DOI: 10.1007/s10107-019-01414-y